A Latent Discriminative Model for Compositional Entailment Relation Recognition using Natural Logic
Authors
Abstract
Recognizing semantic relations between sentences, such as entailment and contradiction, is a challenging task that requires detailed analysis of the interaction between diverse linguistic phenomena. In this paper, we propose a latent discriminative model that unifies a statistical framework with a theory of Natural Logic to capture complex interactions between linguistic phenomena. The proposed approach jointly models alignments, their local semantic relations, and the sentence-level semantic relation. Because alignment edits between sentences and their semantic relations are treated as hidden variables, the model requires only sentence pairs annotated with sentence-level semantic relations as training data to learn appropriate alignments. In an evaluation on a dataset covering diverse linguistic phenomena, the proposed method achieved competitive results on alignment prediction and significant improvements on sentence-level semantic relation recognition compared to an alignment-supervised model. Our analysis did not provide evidence that directly learning alignments and their labels from gold-standard alignments contributes to semantic relation recognition performance; instead, it suggests that such supervision can hurt performance when it prevents the model from learning globally optimal alignments.
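To make the latent-variable idea concrete, the sketch below (not the authors' implementation) shows a toy log-linear model in Python in which candidate alignments are hidden and only the sentence-level relation label is observed; training maximizes the marginal likelihood of the label by summing over alignments. The label set, feature function, and candidate-alignment enumeration are illustrative assumptions, not details taken from the paper.

# A minimal sketch of a latent-alignment log-linear model. Alignments between
# a sentence pair are hidden; only the sentence-level label is supervised.
import numpy as np

LABELS = ["entailment", "contradiction", "neutral"]    # assumed label set

def features(alignment, label, dim=6):
    """Toy joint feature vector over an (alignment, label) configuration."""
    vec = np.zeros(dim)
    vec[LABELS.index(label)] = len(alignment)          # e.g. number of aligned edits
    vec[3 + LABELS.index(label)] = 1.0                 # label bias
    return vec

def log_partition(scores):
    m = scores.max()
    return m + np.log(np.exp(scores - m).sum())

def train(pairs, candidate_alignments, epochs=50, lr=0.1, dim=6):
    """pairs: list of (pair_id, gold_label); alignments remain latent."""
    w = np.zeros(dim)
    for _ in range(epochs):
        for pid, gold in pairs:
            # Enumerate (alignment, label) configurations and score them.
            configs = [(a, y) for a in candidate_alignments[pid] for y in LABELS]
            scores = np.array([w @ features(a, y) for a, y in configs])
            logZ = log_partition(scores)
            # Scores restricted to the gold label (alignments still marginalized).
            gold_idx = [i for i, (_, y) in enumerate(configs) if y == gold]
            logZ_gold = log_partition(scores[gold_idx])
            # Gradient of the marginal log-likelihood:
            # E[f | gold label] - E[f], expectations over hidden alignments.
            grad = np.zeros(dim)
            for i, (a, y) in enumerate(configs):
                grad -= np.exp(scores[i] - logZ) * features(a, y)
                if y == gold:
                    grad += np.exp(scores[i] - logZ_gold) * features(a, y)
            w += lr * grad
    return w

Because the gradient is a difference of feature expectations under the gold-label posterior and the full posterior over hidden alignments, the model is free to discover whichever alignments best explain the sentence-level labels, which mirrors the motivation for not relying on gold-standard alignment supervision.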
Similar papers
THK's Natural Logic-based Compositional Textual Entailment Model at NTCIR-10 RITE-2
This paper describes the THK system that participated in the BC subtask, MC subtask, ExamBC subtask and UnitTest in NTCIR-10 RITE-2. Our system learns plausible transformations of pairs of Text t1 and Hypothesis t2 only from semantic labels of the pairs using a discriminative probabilistic model combined with the framework of Natural Logic. The model is trained so as to prefer alignments and th...
Graded Entailment for Compositional Distributional Semantics
The categorical compositional distributional model of natural language provides a conceptually motivated procedure to compute the meaning of sentences, given grammatical structure and the meanings of its words. This approach has outperformed other models in mainstream empirical language processing tasks. However, until now it has lacked the crucial feature of lexical entailment – as do other di...
Semantic and Logical Inference Model for Textual Entailment
We compare two approaches to the problem of Textual Entailment: SLIM, a compositional approach modeling the task based on identifying relations in the entailment pair, and BoLI, a lexical matching algorithm. SLIM’s framework incorporates a range of resources that solve local entailment problems. A search-based inference procedure unifies these resources, permitting them to interact flexibly. Bo...
Variational Neural Discourse Relation Recognizer
Implicit discourse relation recognition is a crucial component for automatic discourse-level analysis and natural language understanding. Previous studies exploit discriminative models that are built on either powerful manual features or deep discourse representations. In this paper, instead, we explore generative models and propose a variational neural discourse relation recognizer. We refer to...
Natural logic and natural language inference
We propose a model of natural language inference which identifies valid inferences by their lexical and syntactic features, without full semantic interpretation. We extend past work in natural logic, which has focused on semantic containment and monotonicity, by incorporating both semantic exclusion and implicativity. Our model decomposes an inference problem into a sequence of atomic edits lin...
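The snippet below is a minimal, hedged sketch of how relations attached to a chain of atomic edits can be composed into a single sentence-level relation in the spirit of MacCartney-style natural logic. The relation symbols and the partial join table are illustrative; combinations not listed fall back to independence as a conservative approximation, which need not match the treatment in the cited work.

# Composing natural-logic relations over a sequence of atomic edits.
EQ, FWD, REV, NEG, ALT, COV, IND = "=", "<", ">", "^", "|", "_", "#"

# Partial join table: JOIN[(r1, r2)] is the relation between A and C when
# r1 holds between A and B and r2 holds between B and C.
JOIN = {
    (FWD, FWD): FWD,   # A < B, B < C  =>  A < C
    (REV, REV): REV,   # A > B, B > C  =>  A > C
    (NEG, NEG): EQ,    # double negation
    (FWD, NEG): ALT,   # A < B, B ^ C  =>  A | C
    (NEG, REV): ALT,
    (REV, NEG): COV,   # A > B, B ^ C  =>  A _ C
    (NEG, FWD): COV,
    (FWD, ALT): ALT,   # A < B, B | C  =>  A | C
    (ALT, REV): ALT,
}

def join(r1, r2):
    """Compose two basic relations; equality acts as the identity element."""
    if r1 == EQ:
        return r2
    if r2 == EQ:
        return r1
    return JOIN.get((r1, r2), IND)   # unlisted pairs default to independence

def project_sentence_relation(edit_relations):
    """Fold the relations of a chain of atomic edits into one relation."""
    result = EQ
    for r in edit_relations:
        result = join(result, r)
    return result

# Example: substituting a word with a hypernym (<) and then deleting a
# restrictive modifier (<) still yields forward entailment.
print(project_sentence_relation([FWD, FWD]))   # "<"

Defaulting unknown joins to independence keeps the sketch sound but lossy: it can only weaken, never strengthen, the predicted relation, which is the usual trade-off when a full join table is not spelled out.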